Occam Algorithms for Learning From Noisy Examples
Author
Abstract
In the distribution-independent model of concept learning from examples introduced by Valiant [Val84], it has been shown that the existence of an Occam algorithm for a class of concepts implies the computationally feasible (polynomial) learnability of that class [BEHW87a, BEHW87b]. An Occam algorithm is a polynomial-time algorithm that produces, for any sequence of examples, a nearly minimum hypothesis consistent with the examples. These works, however, depend strongly on the assumption of perfect, noiseless examples. This assumption is generally unrealistic: in many real-world situations there is some chance that a noisy example is given to the learning algorithm. In this paper we present a practical extension of Occam algorithms within the Valiant learnability model: Occam algorithms that tolerate classification noise, a noise process introduced in [AL88] in which the classification of each example is subject to independent random mistakes with some small probability. We show that a noise-tolerant Occam algorithm is a powerful algorithmic tool for establishing computationally feasible learning under classification noise: the existence of a noise-tolerant Occam algorithm for a class of concepts is a sufficient condition for the polynomial learnability of that class in the presence of noise.
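The classification-noise setting described above can be illustrated with a small sketch (not from the paper itself): examples are drawn from a noisy oracle that flips each label independently with probability eta, and a noise-tolerant Occam-style learner outputs a succinct hypothesis that minimizes disagreements with the noisy sample rather than insisting on perfect consistency. The threshold concept class and all function names here are illustrative assumptions chosen for the demo.

```python
import random

def noisy_example_oracle(target, draw, eta, rng):
    """Draw x ~ D and return (x, label), with the label flipped
    independently with probability eta (the classification-noise model)."""
    x = draw(rng)
    label = target(x)
    if rng.random() < eta:
        label = not label  # independent random classification mistake
    return x, label

def occam_threshold_learner(sample):
    """Over the succinct class h_t(x) = (x >= t), return the threshold
    minimizing disagreements with the (possibly noisy) sample."""
    # Thresholds at observed points, plus +inf for the all-negative hypothesis,
    # suffice: moving t between sample points cannot change any disagreement.
    candidates = sorted({x for x, _ in sample}) + [float("inf")]
    best_t, _ = min(
        ((t, sum((x >= t) != y for x, y in sample)) for t in candidates),
        key=lambda pair: pair[1],
    )
    return best_t

rng = random.Random(0)
target = lambda x: x >= 0.5          # unknown target concept (assumed)
eta = 0.1                            # classification-noise rate
sample = [noisy_example_oracle(target, lambda r: r.random(), eta, rng)
          for _ in range(500)]
t_hat = occam_threshold_learner(sample)
```

Because the noise is independent of the example, minimizing empirical disagreements still concentrates around the true threshold for a large enough sample, which is the intuition behind replacing exact consistency with near-minimal disagreement in the noise-tolerant setting.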
Similar References
Occam and Compression Algorithms: General PAC Learning Results
These notes are slightly edited from scribe notes in previous years. Please consult the handout of slide copies for definitions and theorem statements. Theorem 0 from Handout Let X be a domain of examples, and C, H concept classes over X. Let A be a learning algorithm such that ∀c ∈ C A takes a sample of m examples and outputs a hypothesis h ∈ H consistent with sample S. Then when using a sampl...
Agnostic Learning and Structural Risk Minimization
In this lecture we address some limitations of the analysis of Occam algorithms that limit their applicability. We first discuss the case where the target concept c is not in H which is known as the non-realizable case and as agnostic PAC learning. We then turn to the case where the number of examples m is fixed (i.e., we cannot ask for more examples as in the standard PAC model) and consider h...
Partial Occam's Razor and Its Applications
We introduce the notion of "partial Occam algorithm". A partial Occam algorithm produces a succinct hypothesis that is partially consistent with given examples, where the proportion of consistent examples is a bit more than half. By using this new notion, we propose one approach for obtaining a PAC learning algorithm. First, as shown in this paper, a partial Occam algorithm is equivalent to a ...
A Markovian Extension of Valiant's Learning Model (Extended Abstract)
Formalizing the process of natural induction and justifying its predictive value is not only basic to the philosophy of science, but is also an essential ingredient of any formal theory of learning. A formulation of the process of induction which is intuitively compelling and reasonable from a computational point of view is arrived at by equating the process of induction with Occam Algorithms ...